πŸ““ Artificial Intelligence/Introduction to AI/Week 3 - Introduction/Definitions/Inter-Rater_Agreement.md by @KGBicheno β˜†

inter-rater agreement

Go back to the [[AI Glossary]]

A measurement of how often human raters agree when doing a task. If raters disagree, the task instructions may need to be improved. Also sometimes called inter-annotator agreement or inter-rater reliability. See also Cohen’s kappa, which is one of the most popular inter-rater agreement measurements.
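A minimal sketch of measuring inter-rater agreement with Cohen's kappa, assuming scikit-learn is available; the two rater label lists below are made-up toy data. Cohen's kappa corrects the raw agreement rate for the agreement expected by chance, so 1.0 means perfect agreement and 0 means chance-level agreement.

```python
# Toy example: two hypothetical raters label the same ten items as "cat" or "dog".
from sklearn.metrics import cohen_kappa_score

rater_a = ["cat", "cat", "dog", "cat", "dog", "dog", "cat", "dog", "cat", "cat"]
rater_b = ["cat", "cat", "dog", "dog", "dog", "dog", "cat", "dog", "cat", "dog"]

# Kappa = (observed agreement - chance agreement) / (1 - chance agreement)
kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa: {kappa:.2f}")  # roughly 0.62 for this toy data
```

If the raters disagree often (low kappa), that is usually a signal to tighten the task instructions before using the labels.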